- NEURAL NET UTILITIES
- version 1.01
- by Gregory Stevens (stevens@prodigal.psych.rochester.edu)
-
- (implementation documentation)
- (These utilities are all ANSI compliant)
-
- To make a neural net utility, the following files are necessary:
-
- NNPARAMS.C (parameter definitions for the size of the net)
- NNINPUTS.C (definitions for defining and loading input patterns)
- NNSTRUCT.C (definition for the net data type and init function)
- NNDISPLY.C (functions for displaying information on the screen)
- NNLOADIN.C (functions for loading inputs into the net and processing)
-
- One of the following must be included as the head of the nnutils list, and
- will chain to the rest. Which to use depends on whether Back Propagation,
- Hebbian learning, or Competitive learning with the Hebb rule is used.
-
- NNBKPROP.C (functions for error and backpropagation algorithm)
- NNHEBBLN.C (functions for error and Hebbian learning)
- NNCOMPET.C (stuff for all-or-none clusters [also uses nnhebbln.c])
- NNRECURN.C (stuff for a recurrent net [also uses nnbkprop.c])
-
- You will also need to have or make the files:
-
- NNINPUTS.DAT (text file with values for the input patterns)
- NNOUTPUT.DAT (this will be needed IF the routine uses supervised learning)
- NNINTEST.DAT (this will be needed IF you are testing unlearned patterns)
-
- NNINPUTS.DAT must contain a number of input patterns equal to NUM_PATTERNS,
- as defined in nninputs.c, with the number of elements per pattern equal to
- INPUT_LAYER_SIZE, as defined in nnparams.c.
-
- NNOUTPUT.DAT must contain a number of output patterns equal to NUM_PATTERNS,
- with the number of elements per pattern equal to OUTPUT_LAYER_SIZE.
-
- NNINTEST.DAT needs to be equal in size and format to NNINPUTS.DAT.
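-
- For illustration only: with NUM_PATTERNS set to 4 and INPUT_LAYER_SIZE set
- to 3, NNINPUTS.DAT might look like the following, one pattern per line
- (whitespace-separated real values are an assumption here; the real format
- is whatever InitInPatterns() reads):
-
-       1.0  0.0  0.0
-       0.0  1.0  0.0
-       0.0  0.0  1.0
-       1.0  1.0  0.0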
-
- In your main file you will only need to include one file: NNBKPROP.C if it
- is supervised learning, NNHEBBLN.C if unsupervised, or NNCOMPET.C if
- competitive unsupervised. All the others are linked in by includes, as in
- the sketch below.
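-
- A minimal sketch of how a supervised-learning main file would begin; the
- one include chains in all of the other utility files:
-
-       /* main file for a back-propagation net */
-       #include "nnbkprop.c"    /* chains in nnparams.c, nninputs.c,   */
-                                /* nnstruct.c, nndisply.c, and         */
-                                /* nnloadin.c through its own includes */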
-
- Map of the data type: Each node has: a real activation state value
- a list of real-valued weights
- to nodes in the previous layer
- a real-valued threshold weight
- an integer flag for the output
- function: 0=linear, 1=logistic
-
- The network is: a structure which has a single
- field, which is an array of
- layers by nodes of type unit.
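-
- As a sketch of that map in C (the field and type names here are
- illustrative assumptions; the actual definitions live in nnstruct.c and
- nnparams.c):
-
-       typedef struct {                /* one node ("unit")             */
-           float State;                /* real activation state value   */
-           float Weights[MAXNODES];    /* weights to previous layer     */
-           float Threshold;            /* real-valued threshold weight  */
-           int   OutFunc;              /* 0 = linear, 1 = logistic      */
-       } UNITtype;
-
-       /* the network: a single field, an array of layers by nodes */
-       typedef struct {
-           UNITtype Unit[NUMLAYERS][MAXNODES];
-       } NNETtype;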
-
-
- User-Accessible Variables and Functions defined in each file:
-
- NNPARAMS.C : INPUT_LAYER_SIZE (size of input layer)
- OUTPUT_LAYER_SIZE (size of output layer)
- NUM_HIDDEN_LAYERS (number of hidden layers)
- HL_SIZE_# (size of each hidden layer)
-
- NUMLAYERS (number of layers total)
- MAXNODES (number of nodes in largest layer)
-
- NNINPUTS.C : NUM_PATTERNS (number of input patterns in file)
- InitInPatterns() (loads patterns from file )
-
- NNSTRUCT.C : NUMNODES[] (number of nodes in each layer )
- InitNet() (initializes network )
-
- NNDISPLY.C : DisplayLayer()
-
- NNLOADIN.C : UpDateInputAct() (loads pattern into the input layer)
- UpDateLayerAct() (updates activations for a [non-input] layer)
-
- NNBKPROP.C : EPSILON (increment for weight changes)
- InitOutPatterns() (loads in the appropriate output patterns)
- GetDerivs() (finds dE/ds for each unit; NOT FOR MAIN)
- UpDateWeightandThresh() (does what it says)
-
- NNHEBBLN.C : UpDateWeightandThresh() (does what it says with Hebb rule)
-
- NNCOMPET.C : AllOrNoneLayerActs() (sets only one node to winner)
- NUM_CLUSTERS_# (number of clusters in a given layer)
-
- User-Modifiable Values in each file:
-
- NNPARAMS.C : INPUT_LAYER_SIZE (size of input layer)
- OUTPUT_LAYER_SIZE (size of output layer)
- NUM_HIDDEN_LAYERS (number of hidden layers)
- HL_SIZE_# (size of each hidden layer)
-
- NNINPUTS.C : NUM_PATTERNS (number of input patterns in file)
-
- NNSTRUCT.C : InitNet() (change how the net is initialized)
-
- NNBKPROP.C : EPSILON (increment for weight changes)
-
- NNCOMPET.C : NUM_CLUSTERS_# (number of clusters in a given layer)
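-
- For example, a net with three inputs, one five-node hidden layer, and two
- outputs would be set up in NNPARAMS.C along these lines (the #define form
- is an assumption; follow whatever form the file itself uses):
-
-       #define INPUT_LAYER_SIZE   3
-       #define OUTPUT_LAYER_SIZE  2
-       #define NUM_HIDDEN_LAYERS  1
-       #define HL_SIZE_1          5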
-
- In the main code, the following variables should be used (not necessarily
- with these variable names):
-
- NNETtype net; /* variable for holding the network */
- PATTERNtype InPatterns, /* holding input training patterns */
- OutPattern; /* holding output "correct" responses */
-
- The following kinds of initialization assignments should appear:
-
- net = InitNet( NUMNODES );
- InPatterns = InitInPatterns();
- OutPattern = InitOutPatterns();
-
- If the backpropagation algorithm will be used, somewhere in the code the
- following line should appear:
-
- net = UpDateWeightandThresh( net, OutPattern, pattern );
-
- where pattern is a looping integer index marking which input pattern is
- the current one.
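-
- In context, a full training pass over the patterns might look like this
- (the loop bounds and the argument orders of UpDateInputAct() and
- UpDateLayerAct() are illustrative assumptions; check nnloadin.c):
-
-       for (pattern = 0; pattern < NUM_PATTERNS; pattern++) {
-           net = UpDateInputAct( InPatterns, pattern, net );  /* clamp input  */
-           for (layer = 1; layer < NUMLAYERS; layer++)        /* forward pass */
-               net = UpDateLayerAct( net, layer );
-           net = UpDateWeightandThresh( net, OutPattern, pattern );
-       }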
-
- If Hebbian learning is used (meaning either NNHEBBLN.C or NNCOMPET.C is
- included):
-
- net = UpDateWeightandThresh( net );
-
- If competitive learning is to be implemented, then in the loop where you
- call UpDateLayerAct(), follow that function call with
-
- net = AllOrNoneLayerActs( net, Layer );
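-
- That is, the forward pass would look roughly like this (loop bounds and
- argument orders assumed as in the earlier sketch):
-
-       for (layer = 1; layer < NUMLAYERS; layer++) {
-           net = UpDateLayerAct( net, layer );
-           net = AllOrNoneLayerActs( net, layer );  /* winner-take-all  */
-       }
-       net = UpDateWeightandThresh( net );          /* Hebb-rule update */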
-
- All these are hints for writing your own from scratch, but I recommend copying
- the demo application code that is most similar to your paradigm and modifying
- a copy. The basic loops will most likely be the same.
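-
- As a starting point, here is a skeletal back-propagation main combining
- the pieces above (the epoch count and the DisplayLayer() argument list are
- illustrative assumptions):
-
-       #include "nnbkprop.c"
-
-       int main(void)
-       {
-           NNETtype    net;
-           PATTERNtype InPatterns, OutPattern;
-           int         epoch, pattern, layer;
-
-           net        = InitNet( NUMNODES );
-           InPatterns = InitInPatterns();
-           OutPattern = InitOutPatterns();
-
-           for (epoch = 0; epoch < 500; epoch++)       /* training epochs */
-               for (pattern = 0; pattern < NUM_PATTERNS; pattern++) {
-                   net = UpDateInputAct( InPatterns, pattern, net );
-                   for (layer = 1; layer < NUMLAYERS; layer++)
-                       net = UpDateLayerAct( net, layer );
-                   net = UpDateWeightandThresh( net, OutPattern, pattern );
-               }
-
-           DisplayLayer( net, NUMLAYERS-1 );    /* show the output layer */
-           return 0;
-       }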
-
- greg
-
- stevens@prodigal.psych.rochester.edu
-